Tree-structured Gaussian Process Approximations

Authors

  • Thang D. Bui
  • Richard E. Turner
Abstract

Gaussian process regression can be accelerated by constructing a small pseudo-dataset to summarize the observed data. This idea sits at the heart of many approximation schemes, but such an approach requires the number of pseudo-datapoints to be scaled with the range of the input space if the accuracy of the approximation is to be maintained. This presents problems in time-series settings or in spatial datasets where large numbers of pseudo-datapoints are required since computation typically scales quadratically with the pseudo-dataset size. In this paper we devise an approximation whose complexity grows linearly with the number of pseudo-datapoints. This is achieved by imposing a tree or chain structure on the pseudo-datapoints and calibrating the approximation using a Kullback-Leibler (KL) minimization. Inference and learning can then be performed efficiently using the Gaussian belief propagation algorithm. We demonstrate the validity of our approach on a set of challenging regression tasks including missing data imputation for audio and spatial datasets. We trace out the speed-accuracy trade-off for the new method and show that the frontier dominates those obtained from a large number of existing approximation techniques.
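To give a feel for why a chain structure yields linear cost, the sketch below shows Gaussian message passing on a chain of ordered pseudo-points, where belief propagation reduces to a Kalman-style forward filter and backward smoother. This is a minimal illustration under strong simplifying assumptions, not the paper's algorithm: it assumes an Ornstein-Uhlenbeck (exponential) kernel, for which the implied chain transitions have a closed form, and the names chain_gp_smoother, kern_var, lengthscale and noise_var are purely illustrative.

import numpy as np

def chain_gp_smoother(z, y, kern_var=1.0, lengthscale=1.0, noise_var=0.1):
    """Posterior mean/variance at ordered inputs z given noisy targets y."""
    M = len(z)
    A = np.exp(-np.diff(z) / lengthscale)      # OU transition coefficients, shape (M-1,)
    Q = kern_var * (1.0 - A ** 2)              # process noise for each chain step

    fm = np.zeros(M)
    fv = np.zeros(M)
    m, v = 0.0, kern_var                       # stationary prior at the first point
    for t in range(M):
        if t > 0:                              # predict one step along the chain
            m = A[t - 1] * m
            v = A[t - 1] ** 2 * v + Q[t - 1]
        k = v / (v + noise_var)                # incorporate the observation y[t]
        m = m + k * (y[t] - m)
        v = (1.0 - k) * v
        fm[t], fv[t] = m, v

    sm, sv = fm.copy(), fv.copy()              # backward (RTS) smoothing pass
    for t in range(M - 2, -1, -1):
        pv = A[t] ** 2 * fv[t] + Q[t]          # predicted variance at t+1 given points 1..t
        g = fv[t] * A[t] / pv
        sm[t] = fm[t] + g * (sm[t + 1] - A[t] * fm[t])
        sv[t] = fv[t] + g ** 2 * (sv[t + 1] - pv)
    return sm, sv

# Illustrative use: pseudo-inputs on a line with noisy sinusoidal targets.
z = np.linspace(0.0, 10.0, 200)
y = np.sin(z) + 0.3 * np.random.randn(200)
mean, var = chain_gp_smoother(z, y, lengthscale=1.5, noise_var=0.09)

Both passes touch each pseudo-point exactly once, so the cost grows linearly with the number of pseudo-points, in contrast with the quadratic or cubic scaling of unstructured pseudo-dataset posteriors.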

Similar resources

Perturbative corrections for approximate inference in Gaussian latent variable models

Expectation Propagation (EP) provides a framework for approximate inference. When the model under consideration is over a latent Gaussian field, with the approximation being Gaussian, we show how these approximations can systematically be corrected. A perturbative expansion is made of the exact but intractable correction, and can be applied to the model’s partition function and other moments of...
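The correction targeted by such expansions can be sketched with a standard identity (a hedged reconstruction in generic notation, not necessarily the notation used in that paper): for a latent Gaussian model with prior p_0(f) and likelihood factors t_n(f_n), EP returns a Gaussian approximation with site terms \tilde t_n and normalizer Z_EP, and the exact partition function satisfies

Z \;=\; \int p_0(\mathbf{f}) \prod_n t_n(f_n)\, d\mathbf{f}
  \;=\; Z_{\mathrm{EP}}\, \mathbb{E}_{q(\mathbf{f})}\!\left[\prod_n \frac{t_n(f_n)}{\tilde t_n(f_n)}\right],
\qquad
q(\mathbf{f}) = \frac{1}{Z_{\mathrm{EP}}}\, p_0(\mathbf{f}) \prod_n \tilde t_n(f_n).

The remaining expectation is the exact but intractable correction factor that the perturbative expansion approximates.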

Tree-structured Gaussian process approximations: supplementary material

subject to q(f|u) = ∏_i q(f_i|u) and ∫ q(f_i|u) df_i = 1. Note that KL(a||b) measures the information "lost" when b is used to approximate a. It was argued in [1] that this KL divergence is an appropriate approximation measure, since the aim is to find a sparse representation u, together with its relationship to f, such that q approximates p. The KL divergence above can be expanded a...
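A hedged reconstruction of the expansion the excerpt refers to, written in the usual sparse-GP notation and possibly differing in detail from the supplement: with p(f,u) = p(u) p(f|u) and the structured approximation q(f,u) = q(u) ∏_i q(f_i|u),

\mathrm{KL}\bigl(p(\mathbf{f},\mathbf{u}) \,\|\, q(\mathbf{f},\mathbf{u})\bigr)
  = \int p(\mathbf{u})\, p(\mathbf{f}\mid\mathbf{u})
    \log \frac{p(\mathbf{u})\, p(\mathbf{f}\mid\mathbf{u})}
              {q(\mathbf{u}) \prod_i q(f_i\mid\mathbf{u})}\, d\mathbf{f}\, d\mathbf{u}
  = \mathrm{KL}\bigl(p(\mathbf{u}) \,\|\, q(\mathbf{u})\bigr)
    + \mathbb{E}_{p(\mathbf{u})}\!\left[ \int p(\mathbf{f}\mid\mathbf{u})
      \log \frac{p(\mathbf{f}\mid\mathbf{u})}{\prod_i q(f_i\mid\mathbf{u})}\, d\mathbf{f} \right].

For a fixed q(u), the second term is minimized under the normalization constraint ∫ q(f_i|u) df_i = 1 by setting each factor to the exact conditional marginal, q(f_i|u) = p(f_i|u).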

Superfast Multifrontal Method for Large Structured Linear Systems of Equations

In this paper we develop a fast direct solver for large discretized linear systems using the supernodal multifrontal method together with low-rank approximations. For linear systems arising from certain partial differential equations such as elliptic equations, during the Gaussian elimination of the matrices with proper ordering, the fill-in has a low-rank property: all off-diagonal blocks have...
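The low-rank property can be illustrated with a small sketch (an illustration of the idea, not that paper's solver): an off-diagonal block coupling two well-separated index sets is typically numerically low rank, so it can be stored and applied through thin factors obtained, for example, from a truncated SVD. The helper name compress_block, the tolerance, and the smooth kernel 1/(1+|x−y|) used to build the example block are all assumptions made for the sketch.

import numpy as np

def compress_block(B, tol=1e-8):
    """Return thin factors U, V with B ~= U @ V, rank set by a singular-value cut-off."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))            # numerical rank of the block
    return U[:, :r] * s[:r], Vt[:r, :]         # storage drops from m*n to (m + n)*r

# Off-diagonal block of a smooth kernel evaluated on two well-separated point sets.
x = np.linspace(0.0, 1.0, 300)[:, None]
y = np.linspace(5.0, 6.0, 300)[None, :]
B = 1.0 / (1.0 + np.abs(x - y))
Uf, Vf = compress_block(B)
print(Uf.shape[1], np.linalg.norm(B - Uf @ Vf) / np.linalg.norm(B))

The printed rank is far smaller than the block dimension while the relative error stays near the tolerance, which is the property structured multifrontal solvers exploit during elimination.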

Structured Variational Inference for Coupled Gaussian Processes

Sparse variational approximations allow for principled and scalable inference in Gaussian Process (GP) models. In settings where several GPs are part of the generative model, these GPs are a posteriori coupled. For many applications such as regression where predictive accuracy is the quantity of interest, this coupling is not crucial. However, if one is interested in posterior uncertainty, it ca...

Journal title:

Volume   Issue

Pages  -

Publication date: 2014